114 research outputs found

    Entropy of Highly Correlated Quantized Data

    This paper considers the entropy of highly correlated quantized samples. Two results are shown. The first concerns sampling and identically scalar quantizing a stationary continuous-time random process over a finite interval. It is shown that if the process crosses a quantization threshold with positive probability, then the joint entropy of the quantized samples tends to infinity as the sampling rate goes to infinity. The second result provides an upper bound to the rate at which the joint entropy tends to infinity, in the case of an infinite-level uniform threshold scalar quantizer and a stationary Gaussian random process. Specifically, an asymptotic formula for the conditional entropy of one quantized sample conditioned on the previous quantized sample is derived. At high sampling rates, these results indicate a sharp contrast between the large encoding rate (in bits/sec) required by a lossy source code consisting of a fixed scalar quantizer and an ideal, sampling-rate-adapted lossless code, and the bounded encoding rate required by an ideal lossy source code operating at the same distortion.
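    The contrast described in this abstract can be illustrated numerically. The sketch below (all parameter choices are illustrative, not taken from the paper) draws pairs of samples, dt apart, from a stationary Gaussian process with autocorrelation exp(-dt), quantizes them with a unit-step uniform quantizer, and estimates H(Q2|Q1) empirically: the conditional entropy per sample shrinks as dt goes to 0, but the implied encoding rate in bits/sec grows without bound.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def cond_entropy(dt, n=200_000, step=1.0):
        """Estimate H(Q2 | Q1) for two samples, dt apart, of a stationary
        Gaussian process with autocorrelation exp(-|dt|), quantized by a
        uniform scalar quantizer with the given step."""
        rho = np.exp(-dt)
        x1 = rng.standard_normal(n)
        x2 = rho * x1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
        q1 = np.floor(x1 / step).astype(int)
        q2 = np.floor(x2 / step).astype(int)

        def emp_H(labels):
            # Empirical entropy in bits from frequency counts.
            _, counts = np.unique(labels, axis=0, return_counts=True)
            p = counts / counts.sum()
            return -(p * np.log2(p)).sum()

        # H(Q2|Q1) = H(Q1, Q2) - H(Q1), both from empirical counts.
        return emp_H(np.stack([q1, q2], axis=1)) - emp_H(q1)

    for dt in [1.0, 0.1, 0.01]:
        h = cond_entropy(dt)
        print(f"dt={dt:5.2f}  H(Q2|Q1)={h:.3f} bits  rate ~ {h / dt:7.1f} bits/sec")
    ```

    As dt decreases, H(Q2|Q1) falls but much more slowly than dt itself, so the per-second rate h/dt of the fixed quantizer diverges, in line with the abstract's claim.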

    Low-Resolution Scalar Quantization for Gaussian Sources and Absolute Error

    This correspondence considers low-resolution scalar quantization for a memoryless Gaussian source with respect to absolute error distortion. It shows that the slope of the operational rate-distortion function of scalar quantization is infinite at the point Dmax where the rate becomes zero. Thus, unlike the situation for squared error distortion, or for Laplacian and exponential sources with squared or absolute error distortion, for a Gaussian source and absolute error, scalar quantization at low rates is far from the Shannon rate-distortion function, i.e., far from the performance of the best lossy coding technique.
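    The zero-rate behaviour can be probed with a Monte Carlo sketch (the setup below is illustrative, not the paper's construction): a binary scalar quantizer with threshold t, with each cell reproduced at its conditional median (the optimal reconstruction for absolute error). Pushing t into the Gaussian tail drives the rate to zero while the distortion approaches Dmax = E|X - median(X)|.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(2_000_000)
    # Zero-rate distortion for absolute error: reproduce at the median.
    d_max = np.mean(np.abs(x - np.median(x)))  # ~ sqrt(2/pi) for N(0,1)

    def rate_distortion(t):
        """Binary quantizer with threshold t; for absolute error the optimal
        reconstruction point of each cell is that cell's conditional median."""
        hi, lo = x[x > t], x[x <= t]
        p = hi.size / x.size
        rate = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
        dist = (np.abs(hi - np.median(hi)).sum()
                + np.abs(lo - np.median(lo)).sum()) / x.size
        return rate, dist

    # As t moves into the tail, the rate goes to zero and the distortion
    # approaches d_max; the ratio R / (d_max - D) keeps growing, consistent
    # with an infinite slope of the operational R(D) curve at zero rate.
    for t in [0.0, 1.5, 2.5]:
        r, d = rate_distortion(t)
        print(f"t={t:3.1f}  R={r:.4f} bits  d_max-D={d_max - d:.5f}  "
              f"R/(d_max-D)={r / (d_max - d):.2f}")
    ```

    For squared error the analogous ratio stays bounded near Dmax; here it does not, which is the contrast the correspondence establishes.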

    Minimum Conditional Description Length Estimation for Markov Random Fields

    In this paper we discuss a method, which we call Minimum Conditional Description Length (MCDL), for estimating the parameters of a subset of sites within a Markov random field. We assume that the edges are known for the entire graph G = (V, E). Then, for a subset U ⊂ V, we estimate the parameters for nodes and edges in U, as well as for edges incident to a node in U, by finding the exponential parameter for that subset that yields the best compression conditioned on the values on the boundary ∂U. Our estimate is derived from a temporally stationary sequence of observations on the set U. We discuss how this method can also be applied to estimate a spatially invariant parameter from a single configuration, and in so doing, derive the Maximum Pseudo-Likelihood (MPL) estimate. Comment: Information Theory and Applications (ITA) workshop, February 201
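    The MPL connection mentioned at the end can be checked numerically in a toy setting. The sketch below (a 1-D Ising chain rather than a general MRF; the parameter values and grid search are illustrative) samples a chain exactly and recovers the coupling by maximizing the pseudo-log-likelihood of each interior spin given its two neighbours.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    theta_true, n = 0.6, 100_000

    # Exact sampling of a 1-D Ising chain with P(x) ∝ exp(theta * sum x_i x_{i+1}):
    # successive spins agree with probability sigmoid(2 * theta).
    p_same = 1.0 / (1.0 + np.exp(-2.0 * theta_true))
    flips = np.where(rng.random(n - 1) < p_same, 1, -1)
    x = np.concatenate(([1], np.cumprod(flips)))

    # Pseudo-log-likelihood of interior spins given both neighbours:
    # P(x_i | nbrs) = exp(theta * x_i * nb_i) / (2 * cosh(theta * nb_i)),
    # where nb_i = x_{i-1} + x_{i+1}.
    nb = x[:-2] + x[2:]
    s = x[1:-1] * nb

    def pll(theta):
        return theta * s.sum() - np.log(2.0 * np.cosh(theta * nb)).sum()

    # Maximize by grid search (coarse but sufficient for the illustration).
    grid = np.linspace(0.0, 1.5, 301)
    theta_mpl = grid[np.argmax([pll(t) for t in grid])]
    print(f"true theta = {theta_true}, MPL estimate = {theta_mpl:.3f}")
    ```

    The estimate lands close to the true coupling, reflecting the consistency of pseudo-likelihood estimation in this simple stationary setting.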

    Row-Centric Lossless Compression of Markov Images

    Motivated by the question of whether the recently introduced Reduced Cutset Coding (RCC) offers rate-complexity performance benefits over conventional context-based conditional coding for sources with two-dimensional Markov structure, this paper compares several row-centric coding strategies that vary in the amount of conditioning as well as whether a model or an empirical table is used in the encoding of blocks of rows. The conclusion is that, at least for sources exhibiting low-order correlations, 1-sided model-based conditional coding is superior to the method of RCC for a given constraint on complexity, and conventional context-based conditional coding is nearly as good as the 1-sided model-based coding. Comment: submitted to ISIT 201
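    The effect of varying the amount of conditioning can be seen on a small synthetic example. The sketch below (a causal raster-scan generation model and parameter choices that are illustrative, not the paper's setup) generates a binary image whose pixels depend on the left and upper neighbours, then compares the empirical coding rate with no context, a left-neighbour context, and a left-plus-upper context.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    H, W, beta = 128, 128, 0.8

    # Raster-scan generation of a +/-1 image where each pixel depends on its
    # left and upper neighbours (a simple causal stand-in for a 2-D MRF).
    img = np.zeros((H, W), dtype=int)
    for i in range(H):
        for j in range(W):
            field = (img[i, j - 1] if j > 0 else 0) + (img[i - 1, j] if i > 0 else 0)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
            img[i, j] = 1 if rng.random() < p_plus else -1

    def emp_entropy(*cols):
        """Empirical joint entropy (bits/symbol) of the given label columns."""
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    cur = img[1:, 1:].ravel()
    left = img[1:, :-1].ravel()
    up = img[:-1, 1:].ravel()

    h0 = emp_entropy(cur)                                    # no conditioning
    h1 = emp_entropy(cur, left) - emp_entropy(left)          # condition on left
    h2 = emp_entropy(cur, left, up) - emp_entropy(left, up)  # left and upper
    print(f"H(X)={h0:.3f}  H(X|left)={h1:.3f}  H(X|left,up)={h2:.3f} bits/pixel")
    ```

    Each added context symbol lowers the achievable empirical rate, which is the trade-off (conditioning versus complexity) that the row-centric comparison in the paper is quantifying.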

    Human Umbilical Cord Blood Cells Restore Brain Damage Induced Changes in Rat Somatosensory Cortex

    Intraperitoneal transplantation of human umbilical cord blood (hUCB) cells has been shown to reduce sensorimotor deficits after hypoxic ischemic brain injury in neonatal rats. However, the neuronal correlate of the functional recovery, and how such a treatment enforces plastic remodelling at the level of neural processing, remains elusive. Here we show by in-vivo recordings that hUCB cells have the capability of ameliorating the injury-related impairment of neural processing in primary somatosensory cortex. Intact cortical processing depends on a delicate balance of inhibitory and excitatory transmission, which is disturbed after injury. We found that the dimensions of cortical maps and receptive fields, which are significantly altered after injury, were largely restored. Additionally, the lesion-induced hyperexcitability was no longer observed in hUCB-treated animals, as indicated by a paired-pulse behaviour resembling that observed in control animals. The beneficial effects on cortical processing were reflected in an almost complete recovery of sensorimotor behaviour. Our results demonstrate that hUCB cells reinstall the way central neurons process information by normalizing inhibitory and excitatory processes. We propose that the intermediate level of cortical processing will become relevant as a new stage to investigate efficacy and mechanisms of cell therapy in the treatment of brain injury.